Here is the bug report number I filed: FB9090982.
Might be worth submitting your issues as well. I see there is going to be a 14.5.1 release. Maybe there is a chance to get a fix for this in there.
I have a support incident open with Apple and have also filed a bug report. They had me try VNVideoProcessor, but the issue persists with it as well: instead of throwing an error, it just skips the frame and never calls the request. The support engineer claims he only sees the issue in the Simulator, not on a real device; I am waiting on further feedback to clarify this.

I am processing videos by extracting each frame, so I am dealing with many frames. Some frames will process, but most will not, and it is consistent which frames do and don't get processed. Prior to iOS/iPadOS 14.5 we had no issues at all.

My fear is that this will not get fixed anytime soon, since the problem seems to live in the OS itself. Apple bakes all of these libraries into the OS, which makes situations like this unsolvable until a future OS release, and even then users need to be on that newer OS. The support engineer's suggestion, if VNVideoProcessor did not work, was to file a bug report and drop a layer lower to work with Core ML directly, bypassing the Vision API. If that is going to be the case, I might try to move off Core ML entirely and work with OpenCV.
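For anyone else hitting this, a minimal sketch of what the VNVideoProcessor path looks like (iOS 14+). The file path and the particular request type here are placeholders, not the actual app code; the point is that on the affected builds the completion handler reportedly just never fires for some frames, with no error surfaced:

```swift
import Vision
import AVFoundation

// Hypothetical example: run a Vision request over a whole video with
// VNVideoProcessor instead of extracting frames manually.
let videoURL = URL(fileURLWithPath: "/path/to/video.mov") // placeholder
let processor = VNVideoProcessor(url: videoURL)

// Any VNRequest works here; body-pose is just an illustration.
let request = VNDetectHumanBodyPoseRequest { request, error in
    if let error = error {
        print("Request failed: \(error)")
        return
    }
    // On the affected OS versions, this handler is reportedly
    // skipped entirely for certain frames.
    print("Frame produced \(request.results?.count ?? 0) observations")
}

do {
    try processor.addRequest(request,
                             processingOptions: VNVideoProcessor.RequestProcessingOptions())
    // Analyze the full duration; pass a narrower CMTimeRange to
    // restrict processing to a span of the video.
    try processor.analyze(CMTimeRange(start: .zero, duration: .positiveInfinity))
} catch {
    print("VNVideoProcessor failed: \(error)")
}
```

This hands frame decoding to Vision rather than your own AVAssetReader loop, which is presumably why the support engineer suggested it as a first step before dropping down to Core ML.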
Take a look here for a workaround: https://developer.apple.com/forums/thread/677333
We just had a crash in AttributeGraph, but it was not on the main thread. Nothing to go on.
Crashed: com.apple.root.utility-qos
0 AttributeGraph 0x1c7d0c7b0 std::__1::__hash_iterator<std::__1::__hash_node<std::__1::__hash_value_type<unsigned long, unsigned char const*>, void*>*> std::__1::__hash_table<std::__1::__hash_value_type<unsigned long, unsigned char const*>, std::__1::__unordered_map_hasher<unsigned long, std::__1::__hash_value_type<unsigned long, unsigned char const*>, std::__1::hash<unsigned long>, true>, std::__1::__unordered_map_equal<unsigned long, std::__1::__hash_value_type<unsigned long, unsigned char const*>, std::__1::equal_to<unsigned long>, true>, std::__1::allocator<std::__1::__hash_value_type<unsigned long, unsigned char const*> > >::find<unsigned long>(unsigned long const&) + 72
1 AttributeGraph 0x1c7d0c6d4 AG::(anonymous namespace)::LayoutCache::drain_queue(void*) + 120
2 libdispatch.dylib 0x1a14aa280 _dispatch_client_callout + 16
3 libdispatch.dylib 0x1a145c254 _dispatch_root_queue_drain + 688
4 libdispatch.dylib 0x1a145c8e4 _dispatch_worker_thread2 + 124
5 libsystem_pthread.dylib 0x1e75df568 _pthread_wqthread + 212
6 libsystem_pthread.dylib 0x1e75e2874 start_wqthread + 8
I feel for you. I am working on a SwiftUI app, and code actually executes differently on iOS 14 than on iOS 13.5. I have my work cut out for me tracking down all the differences. Luckily the app has not gone to production yet.
Thanks for all the replies. It's unfortunate that, like the other frameworks, it's tied to the OS version. Since it is an independent framework, it would be nice if it could be deployed with the app, similar to how Swift libraries were handled. One of the frustrating things now is how long it takes before new features, and in the case of something new like SwiftUI, bug fixes, can be adopted by users. Even when iOS 14 ships in the fall, it will be some time before a critical mass of users install it.